Recurrent Natural Language Parsing

Authors

  • Stan C. Kwasny
  • Sahnny Johnson
  • Barry L. Kalman
Abstract

A recurrent network was trained from sentence examples to construct symbolic parses of sentence forms. Hundreds of sentences, representing significant syntactic complexity, were formulated and then divided into training and testing sets to evaluate the ability of a recurrent network to learn their structure. The network is shown to generalize well over test sentences and the errors that do remain are found to be of a single type and related to human limitations of sentence processing.


Similar Resources

From Imitation to Prediction, Data Compression vs Recurrent Neural Networks for Natural Language Processing

In recent studies [1][13][12], Recurrent Neural Networks were used for generative processes, and their surprising performance can be explained by their ability to make good predictions. In addition, data compression is also based on prediction. The question thus comes down to whether a data compressor could be used to perform as well as recurrent neural networks in natural language processin...

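To make the prediction–compression link in the abstract above concrete, here is a minimal sketch (not from that paper) that scores text with an adaptive bigram character model and charges each symbol its Shannon cost of -log2 p bits; an arithmetic coder driven by the same predictor would approach this total length. The bigram model and Laplace smoothing are illustrative assumptions standing in for a learned predictor.

```python
import math
from collections import defaultdict

def bigram_code_length(text):
    """Estimate the compressed size of `text`, in bits, under an adaptive
    bigram character model: each symbol costs -log2 p(symbol | previous
    symbol). An entropy coder using the same model would approach this."""
    counts = defaultdict(lambda: defaultdict(int))  # counts[prev][ch]
    totals = defaultdict(int)                       # totals[prev]
    bits = 0.0
    prev = None
    for ch in text:
        # Laplace-smoothed predictive probability, computed BEFORE the
        # counts are updated (the decoder sees the same history).
        p = (counts[prev][ch] + 1) / (totals[prev] + 256)
        bits += -math.log2(p)
        counts[prev][ch] += 1
        totals[prev] += 1
        prev = ch
    return bits

sample = "the quick brown fox jumps over the lazy dog " * 20
ratio = bigram_code_length(sample) / (8 * len(sample))
print(ratio)  # fraction of the raw 8-bit-per-character size
```

A better predictor lowers the per-symbol cost directly, which is the sense in which strong language models and strong compressors are two views of the same problem.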

Natural Language Recursion and Recurrent Neural Networks

The recursive structure of natural language was one of the principal, and most telling, sources of difficulty for associationist models of linguistic behaviour. It has, more recently, become a focus in the debate surrounding the generality of neural network models of language, which many would regard as the natural heirs of the associationist legacy. Can neural networks learn to handle recursive ...


Interactive Language Understanding with Multiple Timescale Recurrent Neural Networks

Natural language processing in the human brain is complex and dynamic. Models for understanding how the brain’s architecture acquires language need to take into account the temporal dynamics of verbal utterances as well as of action and visual embodied perception. We propose an architecture based on three Multiple Timescale Recurrent Neural Networks (MTRNNs) interlinked in a cell assembly tha...

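The multiple-timescale idea mentioned above can be sketched with leaky-integrator (CTRNN-style) units, where each layer's time constant tau sets how quickly its state changes: a large tau makes a layer integrate context over longer spans of the input. This is a generic illustration under that assumption, not the authors' exact architecture, and all weight names are hypothetical.

```python
import numpy as np

def mtrnn_step(h_fast, h_slow, x, params, tau_fast=2.0, tau_slow=16.0):
    """One update of a two-timescale pair of recurrent layers using
    leaky-integrator units. The larger tau_slow makes the slow layer's
    state change more gradually than the fast layer's."""
    Wx, Wff, Wfs, Wsf, Wss = (params[k] for k in ("Wx", "Wff", "Wfs", "Wsf", "Wss"))
    # Pre-activations: input drive plus recurrent and cross-layer terms.
    u_fast = Wx @ x + Wff @ h_fast + Wfs @ h_slow
    u_slow = Wsf @ h_fast + Wss @ h_slow
    # Leaky integration: each state moves toward tanh(u) at rate 1/tau.
    h_fast = (1 - 1 / tau_fast) * h_fast + (1 / tau_fast) * np.tanh(u_fast)
    h_slow = (1 - 1 / tau_slow) * h_slow + (1 / tau_slow) * np.tanh(u_slow)
    return h_fast, h_slow

rng = np.random.default_rng(0)
n_fast, n_slow, n_in = 8, 4, 3
params = {"Wx": rng.normal(size=(n_fast, n_in)) * 0.1,
          "Wff": rng.normal(size=(n_fast, n_fast)) * 0.1,
          "Wfs": rng.normal(size=(n_fast, n_slow)) * 0.1,
          "Wsf": rng.normal(size=(n_slow, n_fast)) * 0.1,
          "Wss": rng.normal(size=(n_slow, n_slow)) * 0.1}
h_fast, h_slow = np.zeros(n_fast), np.zeros(n_slow)
for _ in range(10):
    h_fast, h_slow = mtrnn_step(h_fast, h_slow, rng.normal(size=n_in), params)
```

Stacking layers with increasing tau is what lets such a network represent fast phonetic detail and slower sentence-level context in one recurrent system.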

MuFuRU: The Multi-Function Recurrent Unit

Recurrent neural networks such as the GRU and LSTM have found wide adoption in natural language processing and achieve state-of-the-art results for many tasks. These models are characterized by a memory state that can be written to and read from by applying gated composition operations to the current input and the previous state. However, they only cover a small subset of potentially useful compositi...

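The gated composition the abstract above describes can be seen in a standard GRU cell: the update gate interpolates between the previous state and a candidate state, and MuFuRU's contribution is to generalize this fixed interpolation to a learned mixture of composition operations. The sketch below shows only the standard GRU equations (Cho et al., 2014); the parameter names and driver are illustrative, not taken from the paper.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gru_step(h_prev, x, p):
    """One standard GRU update. The gates r and z perform gated
    composition of the current input and the previous state; the new
    state interpolates between h_prev and the candidate h_tilde."""
    z = sigmoid(p["Wz"] @ x + p["Uz"] @ h_prev)              # update gate
    r = sigmoid(p["Wr"] @ x + p["Ur"] @ h_prev)              # reset gate
    h_tilde = np.tanh(p["Wh"] @ x + p["Uh"] @ (r * h_prev))  # candidate state
    return (1 - z) * h_prev + z * h_tilde                    # gated interpolation

rng = np.random.default_rng(1)
n_h, n_x = 6, 4
p = {k: rng.normal(size=(n_h, n_x)) * 0.1 for k in ("Wz", "Wr", "Wh")}
p.update({k: rng.normal(size=(n_h, n_h)) * 0.1 for k in ("Uz", "Ur", "Uh")})
h = np.zeros(n_h)
for _ in range(5):
    h = gru_step(h, rng.normal(size=n_x), p)
```

In this formulation the only composition available is elementwise interpolation; replacing that last line with a weighted sum over several operations (keep, replace, max, min, and so on) is the kind of extension the MuFuRU abstract points at.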

Using Recurrent Neural Network for Learning Expressive Ontologies

Recently, Neural Networks have proven extremely effective in many natural language processing tasks such as sentiment analysis, question answering, and machine translation. Aiming to exploit these advantages in the Ontology Learning process, in this technical report we present a detailed description of a Recurrent Neural Network based system to be used to pursue this goal.



Journal title:

Volume   Issue

Pages  -

Publication date: 1994